Removal of partial tool call streaming #3
Conversation
… package and update imports across modules. Add coroutines dependency to `prompt-model`.
# Conflicts:
#   prompt/prompt-executor/prompt-executor-clients/prompt-executor-anthropic-client/src/commonMain/kotlin/ai/koog/prompt/executor/clients/anthropic/AnthropicLLMClient.kt
#   prompt/prompt-executor/prompt-executor-clients/prompt-executor-google-client/src/commonMain/kotlin/ai/koog/prompt/executor/clients/google/GoogleLLMClient.kt
#   prompt/prompt-executor/prompt-executor-clients/prompt-executor-openai-client/src/commonMain/kotlin/ai/koog/prompt/executor/clients/openai/OpenAILLMClient.kt
#   prompt/prompt-model/src/commonMain/kotlin/ai/koog/prompt/streaming/StreamFrameExt.kt
…put_item.done` type from official OpenAI API
rubencagnie-toast left a comment:
The changes to the OpenAILLMClient didn't work, as not all tool call chunks were emitted.
I just pushed a StreamingAgentWithTools.kt in the examples folder to my branch. Maybe you can try it out yourself to validate the changes.
...penai-client/src/commonMain/kotlin/ai/koog/prompt/executor/clients/openai/OpenAILLMClient.kt (outdated; resolved)
…ools-ruben-partial
…rocessing and tool call handling.
What do you think of the current design, @rubencagnie-toast? I am not sure if my assumptions are correct, but the tests pass, and I tried it with your OpenAI example and it works.
rubencagnie-toast left a comment:
Really like this new approach! Just added some small questions.
prompt/prompt-model/src/commonMain/kotlin/ai/koog/prompt/streaming/StreamFrameExt.kt (two outdated review threads; resolved)
streamFrameFlow {
    val builder = StreamFrameFlowBuilder(this)
    block(builder)
}
Should we add a builder.tryEmitPendingToolCall() just to be safe (in case emitEnd wasn't called properly)?
I was thinking this too, yeah. But if emitEnd wasn't called, that probably means the stream was cancelled prematurely for some reason, and the pending tool call is most likely incomplete in that case.
It all depends on whether the APIs guarantee to always send a finishReason at some point, which I think they do from reading the docs, but I'm not 100% sure.
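For illustration, here is a minimal, self-contained sketch of the pattern being discussed. This is not koog's actual StreamFrameFlowBuilder; apart from the tryEmitPendingToolCall and emitEnd names taken from the comments above, every type and function here is made up. Partial tool-call chunks are accumulated inside the builder, and a single complete ToolCall frame is emitted only when emitEnd runs or the defensive flush kicks in.

```kotlin
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.FlowCollector
import kotlinx.coroutines.flow.flow
import kotlinx.coroutines.runBlocking

// Hypothetical frame types standing in for koog's streaming frames.
sealed interface Frame {
    data class Text(val delta: String) : Frame
    data class ToolCall(val id: String, val name: String, val arguments: String) : Frame
    data class End(val finishReason: String?) : Frame
}

// Hypothetical builder: accumulates partial tool-call chunks and emits a single
// complete ToolCall frame instead of streaming the partial chunks downstream.
class FrameFlowBuilder(private val collector: FlowCollector<Frame>) {
    private var pendingId: String? = null
    private var pendingName: String? = null
    private val pendingArgs = StringBuilder()

    suspend fun emitText(delta: String) = collector.emit(Frame.Text(delta))

    // Called for every partial tool-call chunk received from the provider.
    fun appendToolCallChunk(id: String?, name: String?, argsDelta: String?) {
        id?.let { pendingId = it }
        name?.let { pendingName = it }
        argsDelta?.let { pendingArgs.append(it) }
    }

    // Emits the accumulated tool call if one is pending; safe to call more than once.
    suspend fun tryEmitPendingToolCall() {
        val id = pendingId ?: return
        val name = pendingName ?: return
        collector.emit(Frame.ToolCall(id, name, pendingArgs.toString()))
        pendingId = null
        pendingName = null
        pendingArgs.clear()
    }

    // A finish reason from the provider means the call is complete, so flush it
    // before signalling the end of the stream.
    suspend fun emitEnd(finishReason: String?) {
        tryEmitPendingToolCall()
        collector.emit(Frame.End(finishReason))
    }
}

fun frameFlow(block: suspend FrameFlowBuilder.() -> Unit): Flow<Frame> = flow {
    val builder = FrameFlowBuilder(this)
    builder.block()
    // Defensive flush, as suggested in the review: if the provider never sent a
    // finish reason and emitEnd was never called, don't silently drop the tool call.
    builder.tryEmitPendingToolCall()
}

fun main() = runBlocking {
    frameFlow {
        emitText("Let me check the weather. ")
        appendToolCallChunk(id = "call_1", name = "get_weather", argsDelta = "{\"city\":")
        appendToolCallChunk(id = null, name = null, argsDelta = "\"Paris\"}")
        emitEnd(finishReason = "tool_calls")
    }.collect { println(it) }
}
```

Note that if the flow is cancelled mid-stream, the body never reaches the defensive flush, so a half-built call is still dropped, which matches the concern about premature cancellation above.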
Great work! Feel free to merge!
Alright, checks are failing so I cannot merge it in here. I still have to update some docs and probably fix style issues. EDIT: okay, I merged the changes in.
Sounds perfect. I'll close this PR and we can work from yours as the base! Thanks for the collaboration, I think this is a great change to the framework!
Using the OpenAI `"type":"response.output_item.done"` event to only support streaming of completed tool calls.